Gradient algorithms for quadratic optimization with fast convergence rates

Authors

  • Luc Pronzato
  • Anatoly A. Zhigljavsky
Abstract

We propose a family of gradient algorithms for minimizing a quadratic function f(x) = (Ax, x)/2 − (x, y) in R^d or a Hilbert space, with simple rules for choosing the step-size at each iteration. We show that when the step-sizes are generated by a dynamical system whose ergodic distribution has the arcsine density on a subinterval of the spectrum of A, the asymptotic rate of convergence of the algorithm can approach the (tight) bound on the rate of convergence of a conjugate gradient algorithm stopped before d iterations, with d ≤ ∞ the space dimension.
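The idea in the abstract can be illustrated with a minimal sketch: run plain gradient iterations x_{k+1} = x_k − γ_k(Ax_k − y), where the inverse step-sizes 1/γ_k follow the arcsine density on an interval covering the spectrum of A. As a simplifying assumption, the sketch below draws them i.i.d. by inverse-CDF sampling on the full spectral interval [m, M], rather than via the authors' dynamical-system rule on a subinterval; all function names are illustrative.

```python
import numpy as np

def arcsine_samples(m, M, n, rng):
    # Inverse CDF of the arcsine distribution on [m, M]:
    # F^{-1}(u) = m + (M - m) * sin^2(pi * u / 2)
    u = rng.random(n)
    return m + (M - m) * np.sin(np.pi * u / 2.0) ** 2

def gradient_quadratic(A, y, x0, n_iter, rng):
    # Spectrum endpoints of the symmetric matrix A
    evals = np.linalg.eigvalsh(A)
    m, M = evals[0], evals[-1]
    lam = arcsine_samples(m, M, n_iter, rng)
    x = x0.copy()
    for k in range(n_iter):
        g = A @ x - y          # gradient of f(x) = (Ax, x)/2 - (x, y)
        x = x - g / lam[k]     # step-size gamma_k = 1 / lam_k
    return x

rng = np.random.default_rng(0)
A = np.diag([1.0, 3.0, 10.0])
y = np.array([1.0, 1.0, 1.0])
x = gradient_quadratic(A, y, np.zeros(3), 400, rng)
residual = np.linalg.norm(A @ x - y)
```

Individual steps with 1/γ_k near the bottom of the spectrum can temporarily increase the error, but the arcsine weighting balances the log-contraction factors across the spectrum, which is what drives the fast asymptotic rate the paper analyzes.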


Similar articles

Benchmarking large-scale distributed convex quadratic programming algorithms

This paper aims to collect, benchmark and implement state-of-the-art decomposable convex quadratic programming methods employing duality. In order to decouple the original problem, these methods relax some constraints by introducing dual variables and apply a hierarchical optimization scheme. In the lower level of this scheme, a sequence of parametric quadratic programs is solved in parallel, w...


A Hybrid Particle Swarm - Gradient Algorithm for Global Structural Optimization

The particle swarm optimization (PSO) method is an instance of a successful application of the philosophy of bounded rationality and decentralized decision making for solving global optimization problems. A number of advantages with respect to other evolutionary algorithms are attributed to PSO making it a prospective candidate for optimum structural design. The PSO-based algorithm is robust an...


Fast Optimal H2 Model Reduction Algorithms Based on Grassmann Manifold Optimization

The optimal H2 model reduction is an important tool in studying dynamical systems of a large order and their numerical simulation. We formulate the reduction problem as a minimization problem over the Grassmann manifold. This allows us to develop a fast gradient flow algorithm suitable for large-scale optimal H2 model reduction problems. The proposed algorithm converges globally and the resulti...


Constant Nullspace Strong Convexity and Fast Convergence of Proximal Methods under High-Dimensional Settings

State-of-the-art statistical estimators for high-dimensional problems take the form of regularized, and hence non-smooth, convex programs. A key facet of these statistical estimation problems is that they are typically not strongly convex under a high-dimensional sampling regime, when the Hessian matrix becomes rank-deficient. Under vanilla convexity however, proximal optimization methods attain...


Convergence Rate of an Optimization Algorithm for Minimizing Quadratic Functions with Separable Convex Constraints

A new active set algorithm for minimizing quadratic functions with separable convex constraints is proposed by combining the conjugate gradient method with the projected gradient. It generalizes recently developed algorithms of quadratic programming constrained by simple bounds. A linear convergence rate in terms of the Hessian spectral condition number is proven. Numerical experiments, includi...



Journal:
  • Comp. Opt. and Appl.

Volume 50, Issue -

Pages -

Publication year: 2011